# 09. Udacity Workspace Best Practices
## Best Practices
Follow the best practices outlined below to avoid common issues with Workspaces.
### Keep your home folder small
Your home folder (including subfolders) must be less than 2GB or you may lose data when your session terminates. You may use directories outside of the home folder for more space, but only the contents of the home folder are persisted between sessions and submitted with your project.
NOTE: Your home folder (including subfolders) must be less than 25 megabytes to submit as a project. If the site becomes unresponsive when you try to submit your project, it is likely that your home folder is too large. You can check the size of your home folder by opening a terminal and running the command `du -h . | tail -1`
You can use `ls` to list the files in your terminal and `rm` to remove unwanted files. (Search for both commands online to find example usage.)

- ##### What's the "home folder"?
"Home folder" refers to the directory where users files are stored (compared to locations where system files are stored, for example). (Ref. Wikipedia: home directory) In Workspaces, the home folder is/home/workspace
. Any files in this folder or any subfolder are part of your home folder contents, which means they're saved between sessions and transferred automatically when you switch between CPU/GPU mode.
The folder `/tmp` is not in the home folder; files in any folder outside your home folder are not persisted between sessions or transferred between CPU/GPU mode. You can create a folder outside the home folder using the `mkdir` command from a terminal. For example, you could create a temporary folder to store data using `mkdir -p /data`, which creates a folder at the root directory. You will need to recreate the folder and any data inside it every time you start a new Workspace session.
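For example, if you prefer to set this up from a notebook cell rather than the terminal, a minimal sketch using only Python's standard library (the `/data` path is just the example from above):

```python
import os

# /data sits outside /home/workspace, so it is NOT persisted between sessions.
os.makedirs('/data', exist_ok=True)   # recreate the folder each session

# Anything placed here (downloads, extracted archives, intermediate files)
# must be regenerated at the start of every new Workspace session.
```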
### Keeping your connection alive during long processes
Workspaces automatically disconnect when the connection is inactive for about 30 minutes, which includes inactivity while deep learning models are training. The `workspace_utils.py` module here has been set up to keep your connection alive during training.
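As a rough usage sketch, assuming the module exposes an `active_session` context manager (check the file in your workspace for the exact names; the training loop below is a placeholder for your own code):

```python
import torch
from workspace_utils import active_session  # assumed helper from workspace_utils.py

# Wrap only the long-running work in the keep-alive context manager.
with active_session():
    for epoch in range(epochs):               # placeholder: your training loop
        train_one_epoch(model, train_loader)  # placeholder: your training step

# Save before the last cell finishes (see the second note below); the workspace
# still disconnects about 30 minutes after the final cell completes.
torch.save(model.state_dict(), 'checkpoint.pth')
```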
NOTE: The script sometimes raises a connection error if the request is opened too frequently; just restart the Jupyter kernel and run the cells again to clear the error.
NOTE: These scripts will keep your connection alive while the training process is running, but the workspace will still disconnect 30 minutes after the last notebook cell finishes. Modify the notebook cells to save your work at the end of the last cell, or else you'll lose all progress when the workspace terminates.

### Manage your GPU time
It is important to avoid wasting GPU time in Workspace projects that have GPU acceleration enabled. GPU acceleration provides the greatest benefit for deep learning models, especially during training. In most cases, you can build and test your model (including data pre-processing, defining model architecture, etc.) in CPU mode, then activate GPU mode to accelerate training.
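As an illustration of that workflow (a PyTorch sketch; `MyModel` and `train_loader` are placeholders for your own code), device-agnostic code lets the same notebook run in CPU mode while you develop and in GPU mode when you train:

```python
import torch

# Use the GPU when it is available (GPU mode), otherwise fall back to the CPU.
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

model = MyModel().to(device)             # placeholder: your model class
for images, labels in train_loader:      # placeholder: your DataLoader
    images, labels = images.to(device), labels.to(device)
    ...                                  # forward pass, loss, backward, optimizer step
```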
### Handling "Out of Memory" errors
This issue isn't specific to Workspaces; rather, it is an apparent interaction between PyTorch and Jupyter in which Jupyter reports "out of memory" after a cell crashes. Jupyter holds references to active objects as long as the kernel is running, including objects created before an error is raised. This can cause Jupyter to keep large objects in memory long after they are no longer required. The only known solution so far is to reset the kernel and run the notebook cells again.
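If you want to check whether stale references are what is holding GPU memory before restarting the kernel, a small PyTorch-only diagnostic sketch (an illustration, not part of the original material) looks like this:

```python
import torch

if torch.cuda.is_available():
    # Report how much GPU memory PyTorch has currently allocated and reserved.
    print(f"allocated: {torch.cuda.memory_allocated() / 1e6:.1f} MB")
    print(f"reserved:  {torch.cuda.memory_reserved() / 1e6:.1f} MB")

    # Releasing cached blocks sometimes helps, but if Jupyter still holds
    # references (for example via a stored traceback), these numbers will not
    # drop and restarting the kernel remains the reliable fix.
    torch.cuda.empty_cache()
```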